69 research outputs found

    Second Generation Phenyloxadiazolyl Methyl Sulfones for Thiol-Specific Bioconjugations

    The role of antibody-based molecular agents in the diagnosis and therapy of cancer has expanded significantly over the past decades. However, most of these constructs are synthesized using traditional bioconjugation methods based on random ligations between the molecular cargo and lysine residues within the protein. These non-specific approaches can create poorly defined conjugates with suboptimal immunoreactivity and in vivo performance. Site-specific approaches to antibody bioconjugation based on ligations between maleimides and free cysteine residues have long stood as attractive alternatives, yet the inherent instability of the thiol-maleimide linkage has fueled the search for new, more stable thiol-reactive prosthetic groups. One particularly promising solution is a novel bioconjugation reagent based on a phenyloxadiazolyl methyl sulfone (PODS) scaffold that selectively reacts with free cysteines to form more stable and well-defined immunoconjugates. The first four chapters of this thesis focus on the development, optimization, and evaluation of PODS-based bioconjugation strategies for the generation of immuno- and radioimmunoconjugates with improved immunoreactivity. The fifth and final chapter describes the preparation of antibody-based gold(I) compounds with significant cytotoxicity in HER2-positive breast cancer cells.

    Rigid and Articulated Point Registration with Expectation Conditional Maximization

    This paper addresses the issue of matching rigid and articulated shapes through probabilistic point registration. The problem is recast into a missing-data framework where unknown correspondences are handled via mixture models. Adopting a maximum likelihood principle, we introduce an innovative EM-like algorithm, namely the Expectation Conditional Maximization for Point Registration (ECMPR) algorithm. The algorithm allows the use of general covariance matrices for the mixture model components and improves over the isotropic covariance case. We analyse in detail the associated consequences in terms of estimation of the registration parameters, and we propose an optimal method for estimating the rotational and translational parameters based on semidefinite positive relaxation. We extend rigid registration to articulated registration. Robustness is ensured by detecting and rejecting outliers through the addition of a uniform component to the Gaussian mixture model at hand. We provide an in-depth analysis of our method and compare it both theoretically and experimentally with other robust methods for point registration.
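
    As a rough illustration of the EM iteration described above, the sketch below (a simplified variant, not the authors' implementation) registers a data point set onto a model point set using a Gaussian mixture with a uniform outlier component; it assumes isotropic covariances and a closed-form weighted Procrustes rotation update in place of ECMPR's general covariances and semidefinite positive relaxation, and all names are illustrative.

```python
# Simplified EM sketch for rigid point registration with a Gaussian mixture plus a
# uniform outlier component. Unlike the full ECMPR algorithm, it assumes isotropic
# covariances and uses a closed-form weighted Procrustes (SVD) rotation update
# instead of semidefinite positive relaxation. All names are illustrative.
import numpy as np

def rigid_em_registration(model, data, n_iter=50, sigma2=1.0, w_outlier=0.1):
    """Estimate R, t aligning `data` (N x 3) onto `model` (M x 3)."""
    M = len(model)
    R, t = np.eye(3), np.zeros(3)
    volume = np.prod(data.max(axis=0) - data.min(axis=0)) + 1e-9   # support of the uniform component
    prior = (1.0 - w_outlier) / M                                  # equal priors on the Gaussian components
    for _ in range(n_iter):
        # E-step: responsibility of model point i for transformed data point j.
        moved = data @ R.T + t
        d2 = ((moved[:, None, :] - model[None, :, :]) ** 2).sum(-1)        # (N, M) squared distances
        gauss = prior * np.exp(-0.5 * d2 / sigma2) / (2 * np.pi * sigma2) ** 1.5
        post = gauss / (gauss.sum(axis=1, keepdims=True) + w_outlier / volume)
        # M-step: weighted Procrustes for (R, t), then variance update.
        w = post.sum()
        mu_d = (post.sum(axis=1) @ data) / w
        mu_m = (post.sum(axis=0) @ model) / w
        A = (data - mu_d).T @ post @ (model - mu_m)                 # 3 x 3 cross-covariance
        U, _, Vt = np.linalg.svd(A)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))]) # guard against reflections
        R = Vt.T @ D @ U.T
        t = mu_m - R @ mu_d
        moved = data @ R.T + t
        d2 = ((moved[:, None, :] - model[None, :, :]) ** 2).sum(-1)
        sigma2 = (post * d2).sum() / (3.0 * w) + 1e-9
    return R, t
```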

    Virtual Clay for Direct Hand Manipulation

    In order to make virtual modeling as easy as real clay manipulation, we describe a real-time virtual clay model specially designed for direct hand manipulation. We build on a previous layered model for clay, extending it to handle local properties such as colour or fluidity, to deal with an arbitrary number of tools, and to capture twist effects due to rotating tools. The resulting clay model is a first step towards a longer-term goal, namely direct interaction through video tracking of the user's hands.
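
    Purely as an illustration of what a clay representation with local properties might look like (this is not the layered model of the paper), the sketch below stores per-voxel density, colour and fluidity and applies a spherical tool that pushes material into a surrounding shell; the class, its attributes and the displacement rule are assumptions made for the example.

```python
# Illustrative sketch only (not the paper's layered model): a voxel grid carrying
# per-cell clay density plus local attributes (colour, fluidity), and a spherical
# tool that pushes material out of the volume it occupies. The class, attributes
# and displacement rule are assumptions made for the example.
import numpy as np

class ClayGrid:
    def __init__(self, size=64, voxel=1.0):
        self.voxel = voxel
        self.density = np.zeros((size, size, size))        # amount of clay in each voxel
        self.colour = np.zeros((size, size, size, 3))      # local colour attribute
        self.fluidity = np.full((size, size, size), 0.5)   # local deformability attribute
        # Precompute voxel-centre coordinates once.
        self.coords = np.indices(self.density.shape).transpose(1, 2, 3, 0) * voxel

    def apply_spherical_tool(self, centre, radius):
        """Displace clay from inside a spherical tool into a thin shell around it."""
        dist = np.linalg.norm(self.coords - np.asarray(centre), axis=-1)
        inside = dist < radius
        shell = (dist >= radius) & (dist < radius + 2 * self.voxel)
        displaced = (self.density[inside] * self.fluidity[inside]).sum()
        self.density[inside] *= 1.0 - self.fluidity[inside]   # only the 'fluid' fraction moves
        if shell.any():
            self.density[shell] += displaced / shell.sum()     # crude mass-conserving redistribution
```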

    Investigating self-similarity and heavy tailed distributions on a large scale experimental facility

    After the seminal work by Taqqu et al. relating self-similarity to heavy-tailed distributions, a number of research articles verified that aggregated Internet traffic time series show self-similarity and that Internet attributes, like Web file sizes and flow lengths, are heavy-tailed. However, the validation of the theoretical prediction relating self-similarity and heavy tails remains unsatisfactorily addressed, having been investigated either using numerical or network simulations, or from uncontrolled Web traffic data. Notably, this prediction has never been conclusively verified on real networks using controlled and stationary scenarios, prescribing specific heavy-tailed distributions, and estimating confidence intervals. In the present work, we use the potential and facilities offered by the large-scale, deeply reconfigurable and fully controllable experimental Grid5000 instrument to investigate the observability of the prediction on real networks. To this end, we organize a large number of controlled traffic circulation sessions on a nation-wide real network involving two hundred independent hosts. We use an FPGA-based measurement system to collect the corresponding traffic at the packet level. We then independently estimate the self-similarity exponent of the aggregated time series and the heavy-tail index of the flow-size distributions. Comparing these two estimated parameters enables us to discuss the practical applicability conditions of the theoretical prediction.
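
    The theoretical prediction in question is Taqqu's relation between the tail index alpha of the heavy-tailed size distribution and the Hurst exponent H of the aggregated traffic, namely H = (3 - alpha)/2 for 1 < alpha < 2. The sketch below illustrates the heavy-tail side of such a comparison under simplifying assumptions; the measured exponent would be estimated separately (e.g. by wavelet or aggregated-variance analysis; a sketch of the latter follows the next abstract).

```python
# Sketch of the heavy-tail side of the comparison: a Hill estimator for the tail
# index alpha of the flow-size distribution, and Taqqu's relation H = (3 - alpha)/2
# (valid for 1 < alpha < 2) giving the predicted self-similarity exponent. The
# estimator choice, k, and the synthetic Pareto example are illustrative, not the
# instrumentation used in the study.
import numpy as np

def hill_tail_index(samples, k):
    """Hill estimator of alpha from the k largest order statistics."""
    x = np.sort(np.asarray(samples, dtype=float))[::-1]
    top = x[:k + 1]
    return k / np.sum(np.log(top[:k] / top[k]))

def predicted_hurst(alpha):
    """Taqqu et al.: heavy-tailed ON/OFF sources yield aggregated traffic with H = (3 - alpha)/2."""
    return (3.0 - alpha) / 2.0

# Synthetic check: Pareto flow sizes with alpha = 1.5 should give H close to 0.75.
rng = np.random.default_rng(0)
flow_sizes = rng.pareto(1.5, size=50_000) + 1.0
alpha_hat = hill_tail_index(flow_sizes, k=500)
print(f"alpha ~ {alpha_hat:.2f}, predicted H ~ {predicted_hurst(alpha_hat):.2f}")
```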

    Investigating self-similarity and heavy-tailed distributions on a large scale experimental facility

    After the seminal work by Taqqu et al. relating self-similarity to heavy-tailed distributions, a number of research articles verified that aggregated Internet traffic time series show self-similarity and that Internet attributes, like Web file sizes and flow lengths, are heavy-tailed. However, the validation of the theoretical prediction relating self-similarity and heavy tails remains unsatisfactorily addressed, having been investigated either using numerical or network simulations, or from uncontrolled Web traffic data. Notably, this prediction has never been conclusively verified on real networks using controlled and stationary scenarios, prescribing specific heavy-tailed distributions, and estimating confidence intervals. With this goal in mind, we use the potential and facilities offered by the large-scale, deeply reconfigurable and fully controllable experimental Grid5000 instrument to investigate the observability of the prediction on real networks. To this end, we organize a large number of controlled traffic circulation sessions on a nation-wide real network involving two hundred independent hosts. We use an FPGA-based measurement system to collect the corresponding traffic at the packet level. We then independently estimate the self-similarity exponent of the aggregated time series and the heavy-tail index of the flow-size distributions. On the one hand, our results complement and validate with striking accuracy some conclusions drawn from a series of pioneering studies. On the other hand, they bring new insights into the controversial role of certain components of real networks.
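
    Complementing the tail-index sketch above, the snippet below illustrates one simple way to estimate the measured self-similarity exponent from the aggregated packet-count time series; it is an assumption-laden illustration rather than the estimator actually used in the study.

```python
# Sketch of the measured side: estimate the self-similarity (Hurst) exponent of the
# aggregated packet-count series with the aggregated-variance method, which uses the
# scaling Var(X^(m)) ~ m^(2H - 2) of block-averaged series. Block sizes are assumed;
# wavelet-based estimators with confidence intervals would be the more robust choice.
import numpy as np

def hurst_aggregated_variance(series, block_sizes=(1, 2, 4, 8, 16, 32, 64, 128)):
    """Estimate H from the slope of log Var(block mean) versus log block size."""
    series = np.asarray(series, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(series) // m
        if n_blocks < 2:
            continue
        blocks = series[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_var.append(np.log(blocks.var()))
    slope, _ = np.polyfit(log_m, log_var, 1)   # slope = 2H - 2
    return 1.0 + slope / 2.0
```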

    Unsupervised host behavior classification from connection patterns

    A novel host behavior classification approach is proposed as a preliminary step toward traffic classification and anomaly detection in network communication. Though many attempts described in the literature were devoted to flow or application classification, these approaches are not always adaptable to the operational constraints of traffic monitoring (expected to work even without packet payload, without bidirectionality, on high-speed networks, or from flow reports only...). Instead, the classification proposed here relies on the leading idea that traffic is relevantly analyzed in terms of typical host behaviors: typical connection patterns of both legitimate applications (data sharing, downloading, ...) and anomalous (possibly aggressive) behaviors are obtained by profiling traffic at the host level using unsupervised statistical classification. Classification at the host level is not reducible to flow or application classification, nor the reverse: they are different operations which may play complementary roles in network management. The proposed host classification is based on a nine-dimensional feature space evaluating host Internet connectivity, dispersion, and exchanged traffic content. A Minimum Spanning Tree (MST) clustering technique is developed that does not require any supervised learning step to produce a set of statistically established typical host behaviors. Not relying on a priori defined classes of known behaviors enables the procedure to discover new host behaviors that potentially were never observed before. This procedure is applied to traffic collected over the entire year 2008 on a transpacific (Japan/USA) link. A cross-validation of this unsupervised classification against a classical port-based inspection and a state-of-the-art method provides an assessment of the meaningfulness and relevance of the obtained classes of host behaviors.
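
    As a hedged illustration of the MST clustering step (not the paper's exact procedure), the sketch below assumes a hosts-by-features matrix, builds the minimum spanning tree of pairwise distances, cuts unusually long edges, and reads the connected components off as host behavior classes.

```python
# Minimal MST clustering sketch: hosts are rows of a (n_hosts x 9) feature matrix;
# build the minimum spanning tree of pairwise Euclidean distances, cut edges much
# longer than the typical edge, and treat the resulting connected components as
# behavior classes. Standardization and the median-based cut rule are assumptions.
import numpy as np
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_host_clusters(features, cut_factor=3.0):
    """Return an integer cluster label per host (row of `features`)."""
    x = (features - features.mean(axis=0)) / (features.std(axis=0) + 1e-9)
    dist = squareform(pdist(x))                      # dense pairwise distance matrix
    mst = minimum_spanning_tree(dist).toarray()      # non-zero entries are MST edges
    edges = mst[mst > 0]
    mst[mst > cut_factor * np.median(edges)] = 0.0   # remove unusually long edges
    _, labels = connected_components(mst, directed=False)
    return labels
```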

    Inverse electron demand Diels-Alder click chemistry for pretargeted PET imaging and radioimmunotherapy

    This approach leverages the rapid, bio-orthogonal inverse electron demand Diels-Alder reaction between a radiolabeled tetrazine and a trans-cyclooctene-bearing antibody to enable pretargeted positron emission tomography imaging and endoradiotherapy in a murine model of cancer. Radiolabeled antibodies have shown promise as tools for both the nuclear imaging and endoradiotherapy of cancer, but the protracted circulation time of radioimmunoconjugates can lead to high radiation doses to healthy tissues. To circumvent this issue, we have developed an approach to positron emission tomography (PET) imaging and radioimmunotherapy (RIT) predicated on radiolabeling the antibody after it has reached its target within the body. This in vivo pretargeting strategy is based on the rapid and bio-orthogonal inverse electron demand Diels-Alder reaction between tetrazine (Tz) and trans-cyclooctene (TCO). Pretargeted PET imaging and RIT using TCO-modified antibodies in conjunction with Tz-bearing radioligands produce high activity concentrations in target tissues as well as reduced radiation doses to healthy organs compared to directly labeled radioimmunoconjugates. Herein, we describe how to prepare a TCO-modified antibody (humanized A33-TCO) as well as how to synthesize two Tz-bearing radioligands: one labeled with the positron-emitting radiometal copper-64 ([Cu-64]Cu-SarAr-Tz) and one labeled with the beta-emitting radiolanthanide lutetium-177 ([Lu-177]Lu-DOTA-PEG(7)-Tz). We also provide a detailed description of pretargeted PET and pretargeted RIT experiments in a murine model of human colorectal carcinoma. Proper training in both radiation safety and the handling of laboratory mice is required for the successful execution of this protocol.